Teaching Yourself: A Self-Knowledge Distillation Approach to Action Recognition
Authors
Abstract
Knowledge distillation, the process of transferring complex knowledge learned by a heavy network (the teacher) to a lightweight network (the student), has emerged as an effective technique for compressing neural networks. To remove the need to train a large teacher, this paper leverages the recent self-knowledge distillation approach, in which a student network is trained by progressively distilling its own knowledge without a pre-trained teacher network. Unlike existing methods, which mainly focus on still images, our proposed Teaching Yourself approach targets human action recognition in videos. It is designed not only to compress the model but also to improve its generalization capability. In our approach, the student is able to update itself using its best past model, termed the preceding model, which is then utilized to guide the training of the present model. Inspired by consistency training in state-of-the-art semi-supervised learning, we introduce an augmentation strategy to increase data diversity and encourage consistent predictions. Benchmarks have been conducted with both 3D ResNet-18 and 3D ResNet-50 backbone networks, evaluated on standard datasets such as UCF101, HMDB51, and Kinetics400. The experimental results show that our teaching-yourself method significantly improves recognition accuracy compared to supervised baselines. We also present an extensive ablation study demonstrating that our method mitigates overconfident dark knowledge and generates more input variations of the same data point. Code is available at https://github.com/vdquang1991/Self-KD.
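The core idea of self-knowledge distillation described above, training the current student against the soft predictions of its own best past snapshot (the preceding model), can be sketched as a loss function. This is a minimal illustration, not the paper's implementation: the function name, temperature `T`, and mixing weight `alpha` are assumptions, and the actual method combines this with the consistency/augmentation strategy.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_distillation_loss(student_logits, preceding_logits, labels, T=4.0, alpha=0.5):
    """Cross-entropy on ground-truth labels plus a KL term that pulls the
    current student toward the soft predictions of its own best past
    snapshot (the 'preceding' model). No pre-trained teacher is needed."""
    p_student = softmax(student_logits, T)
    p_preceding = softmax(preceding_logits, T)
    # KL(preceding || student), scaled by T^2 as in standard distillation.
    kl = np.sum(p_preceding * (np.log(p_preceding + 1e-12)
                               - np.log(p_student + 1e-12)), axis=-1)
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * T ** 2 * kl + (1 - alpha) * ce))
```

During training, the preceding model's weights would be refreshed whenever the student reaches a new best validation accuracy, so the soft targets steadily improve without ever training a separate teacher.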
Similar resources
A Knowledge-Based Approach to Goal Recognition
We tackle in this paper the problem known as plan recognition. Our aim is to try to restudy the problem, paying particular attention to the underlying reasoning processes. We rename the problem as goal recognition, explain the details of the reasoning steps that are required for it, and argue in particular that forward reasoning is the central process in goal recognition though it needs to be s...
Survey the Knowledge Level of Senior Nursing Students on Teaching Self Care to Cancer Patients
Abstract: This is a descriptive study conducted to determine the level of knowledge of final-year nursing students at the nursing and midwifery schools affiliated with Guilan University of Medical Sciences regarding teaching care points to cancer patients. The sample consisted of 108 subjects, and the data-collection instrument was a questionnaire of 40 questions in two parts concerning teaching care points to patients undergoing chemotherapy and radiotherapy...
A Distillation Approach to Refining Learning Objects
McCalla's ecological approach to e-learning systems [2] is described as “attaching models of learners to the learning objects they interact with, and then mining these models for patterns that are useful for various purposes.” The starting point of our research is to honour McCalla's ecological approach by identifying which users in a system are similar to each other, to then preferentially rec...
A Systematic Approach to Distillation Column Control
This paper presents a systematic approach to distillation column control. The main emphasis is on the steps which precede the actual controller design, namely the modelling of the column and the selection of the control configuration. By control configuration in this context we mean the two independent variables used for composition control (for example, L and V; D and V, or L/D and V/B). The st...
A Systemic Approach to Improving Teaching and Learning
This paper describes one university’s approach to improving the quality of teaching and learning at the institutional level, based on the premise of improving the design of curriculum rather than focusing on the skills of teachers as such. The paper describes the process by which university-wide principles of curriculum design were defined and agreed, as well as the parallel campaigns needed to...
Journal
Journal title: IEEE Access
Year: 2021
ISSN: 2169-3536
DOI: https://doi.org/10.1109/access.2021.3099856